Web Survey Bibliography
Traditionally, business data for official statistics have been collected with paper questionnaires in self-administered surveys. Nowadays the paper questionnaire is increasingly being replaced by web questionnaires. A variety of strategies can be followed to introduce the web into business surveys. In Norway in 2004, and recently in Denmark, it was decided that all business surveys should be moved to the web quickly. In the Netherlands an effort has been made to develop a well-designed web questionnaire: the Structural Business Survey questionnaire was fully designed and pre-tested over a two-year period. The result was intended to serve as an example for all other surveys. A driving force behind this development is a common interest among surveyors and those who are surveyed in reducing the manpower, and hence the costs, of business surveys (including response burden).
But neither these ambitions nor quality improvements come automatically with technological innovations. At international conferences, workshops and meetings, we find that many methodologists are struggling with the implementation of these technologies. In February 2010, methodologists from eight European countries met in Copenhagen to discuss how common EU-regulated surveys can best be transferred from paper to web (for both business and social surveys; the focus was on business surveys). The idea for this meeting was born when data collection methodologists from Statistics Denmark visited Statistics Netherlands in May 2009 to discuss web questionnaire designs. The initiative to organise this meeting was taken at the 2009 ISM Workshop in Bergamo. As a follow-up to the Copenhagen meeting, this topic was also on the agenda of the Eurostat Working Group of Statistical Quality in June 2010, where it was decided to discuss the need for an action plan and concrete projects with the Directors of Methodology.
This 2011 ISM presentation is a follow-up to the Copenhagen initiative and is meant to report back to the participants on what has been done. In the presentation we will give an overview of the issues that have been discussed, and relate them to non-sampling errors such as non-response and measurement issues, as well as to response burden. We would like to discuss with the audience how the Copenhagen initiative and the issues raised can best be followed up.
Issues that have been discussed (and which relate to other presentations in the Workshop) are:
– An issue that comes up over and over again is how to get sampled units to pick up the web questionnaire: What strategies should be used to increase the take-up rate? Should a paper questionnaire still be available and presented?
– Where do we stand when it comes to guidelines on how to design web questionnaires? One much-discussed issue under this headline is how similar or different web and paper questionnaires in a mixed-mode data collection design should be. When respondents use a web questionnaire, they expect it to have some intelligence. What do respondents expect, and what guidelines can be given to make the questionnaire respondent-friendly? Issues here include the use of matrix questions and edit checks, which help to obtain good data quality but may also cause respondents to abort the questionnaire. Another issue is the use of historical data in the questionnaire (comparable to dependent interviewing).
– Talking about mixed-mode designs: How to deal with mode effects?
– Technology issues include, for example, how to deal with the variety of web browsers.
– How to implement web questionnaires? When moving to the web, Statistics Netherlands on the one hand, and Statistics Norway and Statistics Denmark on the other, adopted different approaches (as discussed above). What did we learn from these two approaches?
– Once a questionnaire has been developed, the issue is: How do new methods affect pre-tests and the ability to monitor the response process?
During the development of web questionnaires, traditional cognitive interviewing and the techniques of usability studies can be combined, e.g. by using paradata and eye-tracking during individual tests. Computerized questionnaires also open up the possibility of monitoring the response process in detail while conducting the survey (using paradata), both at the level of overall response rates and at the level of individual respondents (using audit trails).
In our presentation we will focus in more detail on web pick-up issues and response burden issues.
Issues that have not yet been addressed, but which are important and can be discussed at the workshop, are:
– How to organise the data collection and logistics for web and mixed-mode surveys?
– What software should be used: should one develop one's own software or use software that is available on the market?
– How to organise research, collaborate with universities, and bring in the literature?